
    The Intellectual Advancement of Human-Computer Interaction Research: A Critical Assessment of the MIS Literature (1990-2008)

    This paper assesses the intellectual advancement of Human-Computer Interaction (HCI) scholarship as one of the five research streams of the Management Information Systems (MIS) discipline. It particularly demonstrates the vitality and maturity that the HCI stream (or sub-discipline) has achieved in recent years, and adds to the few studies that draw an overarching picture of HCI. This study uses the same approach as that of Zhang and Li (2005), and delineates the intellectual development of HCI research in MIS by employing a multifaceted assessment of the published HCI articles over a period of 19 years (1990-2008) in eight primary MIS journals. In addition, this study includes several journal special issues and two book collections in the assessment. Twenty-four specific questions are addressed to answer the following five mega-research questions about the HCI sub-discipline: (1) What constitutes HCI’s intellectual substance? (2) What relationships does HCI have with other disciplines? (3) How is HCI evolving? (4) What are the patterns of HCI publication in the primary MIS journals? And, (5) Who are the contributing scholars? A number of areas for future research are identified, along with a discussion of potential future directions for the sub-discipline. This study is of interest to researchers in the HCI sub-discipline, the MIS discipline, and other related disciplines to inform future research, collaboration, publication, and education. It should also be of interest to doctoral students for identifying potential dissertation topics and academic institutions for future employment where such research is understood, appreciated, and encouraged.

    An Intelligent Optimal Secure Framework for Malicious Events Prevention in IOT Cloud Networks

    Intrusion is a significant threat in cloud networks, so an efficient mechanism is required to prevent intrusions and make the cloud system more secure. To this end, a novel Artificial Bee-based Elman Neural Security Framework (ABENSF) is developed in this article. The developed model rescales the raw dataset using a pre-processing function. Moreover, the artificial bee's optimal fitness function is integrated into the feature extraction phase to track and extract attack features. In addition, the monitoring mechanism in the developed model provides high security to the network by preventing attacks. Together, the tracking and monitoring functions avoid intrusion by eliminating both known and unknown attacks. The presented work was designed and validated on the NSL-KDD dataset in Python. Finally, the performance parameters of the presented work are estimated and verified against existing techniques in a comparative analysis, which shows that the developed model achieves better outcomes than the others.
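
    The abstract does not specify its pre-processing function; a minimal sketch of one common choice, min-max rescaling of each feature column to [0, 1], might look like the following. The function name and toy data are hypothetical, not taken from the paper.

```python
def min_max_rescale(rows):
    """Rescale each numeric column of `rows` (a list of equal-length
    numeric lists) to the range [0, 1], column by column."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [
        # constant columns map to 0.0 to avoid division by zero
        [(v - l) / (h - l) if h > l else 0.0 for v, l, h in zip(row, lo, hi)]
        for row in rows
    ]

raw = [[0.0, 10.0], [5.0, 20.0], [10.0, 30.0]]  # toy "raw dataset"
scaled = min_max_rescale(raw)
# scaled -> [[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]]
```

    In practice the NSL-KDD features would be loaded from file and categorical columns encoded before such a rescaling step.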

    The Intellectual Development of Human-Computer Interaction Research: A Critical Assessment of the MIS Literature (1990-2002)

    As one of the five research streams of the Management Information Systems (MIS) discipline, Human-Computer Interaction (HCI) was predicted to resurge in the post-millennium era. To date, however, few studies have either synthesized existing studies or drawn an overarching picture of this sub-discipline. This study delineates the intellectual development of HCI research in MIS by a multifaceted assessment of the published HCI articles over a period of 13 years (1990-2002) in seven prime MIS journals: MISQ, ISR, JMIS, Decision Sciences, Management Science, DATA BASE, and JAIS. Twenty-two specific questions are addressed to answer the following five general research questions about the HCI sub-discipline: (1) What constitutes its intellectual substance? (2) What relationships does it have with other disciplines? (3) What are its recent evolutions? (4) What are the patterns of publishing HCI studies in the primary MIS journals? And, (5) Who are its contributing members? We use a classification approach to address these questions. Descriptive analyses, including co-occurrence and cross-facet analyses, depict the key relationships. Trend analyses demonstrate recent evolutions. We present a number of areas for future research, along with a discussion of potential future directions for the sub-discipline. This study should be of interest to researchers in this sub-discipline, in the MIS discipline, and in other related disciplines for future research, collaboration, publication, and education. It should also be of interest to doctoral students to identify potential topics for dissertation research and to identify academic institutions for future employment where such research is understood, appreciated, and encouraged.
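
    A co-occurrence analysis of the kind mentioned above can be sketched as counting how often two classification keywords appear in the same article. This is an illustrative reconstruction, not the authors' code, and the keyword lists are invented.

```python
from collections import Counter
from itertools import combinations

def cooccurrence(articles):
    """Count how often each unordered pair of keywords appears
    together in the same article's keyword list."""
    pairs = Counter()
    for keywords in articles:
        # sort so each unordered pair gets one canonical key
        for a, b in combinations(sorted(set(keywords)), 2):
            pairs[(a, b)] += 1
    return pairs

articles = [
    ["usability", "trust", "e-commerce"],  # invented keyword lists
    ["usability", "trust"],
    ["e-commerce", "trust"],
]
counts = cooccurrence(articles)
# counts[("trust", "usability")] -> 2
```

    The resulting pair counts can then be cross-tabulated against other facets (journal, year, research method) for the cross-facet and trend analyses the abstract describes.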

    Effects of Uncertainty in Climate Inputs on Simulated Evapotranspiration and Runoff in the Western Arctic

    Hydrological models require accurate precipitation and air temperature inputs in order to adequately depict water fluxes and storages across Arctic regions. Biases such as gauge undercatch, as well as uncertainties in numerical weather prediction reanalysis data that propagate through water budget models, limit the ability to accurately model the terrestrial Arctic water cycle. A hydrological model forced with three climate datasets and three methods of estimating potential evapotranspiration (PET) was used to better understand the impact of these processes on simulated water fluxes across the Western Arctic Linkage Experiment (WALE) domain. Climate data were drawn from the NCEP–NCAR reanalysis (NNR) (NCEP1), a modified version of the NNR (NCEP2), and the Willmott–Matsuura (WM) dataset. PET methods applied in the model were Hamon, Penman–Monteith, and Penman–Monteith using adjusted vapor pressure data. High vapor pressures in the NNR lead to low simulated evapotranspiration (ET) in model runs using the Penman–Monteith PET method, resulting in increased runoff. Annual ET derived from simulations using Penman–Monteith PET was half the magnitude of ET simulated when the Hamon method was used. Adjustments made to the reanalysis vapor pressure data increased the simulated ET flux, reducing simulated runoff. Using the NCEP2 or WM climate data, along with the Penman–Monteith PET function, results in agreement to within 7% between the simulated and observed runoff across the Yukon River basin. The results reveal the high degree of uncertainty present in climate data and the range of water fluxes generated from common model drivers. This suggests the need for thorough evaluations of model requirements and potential biases in forcing data, as well as corroborations with observed data, in all efforts to simulate Arctic water balances.
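
    Of the PET methods compared above, the Hamon method is the simplest, depending only on air temperature and daylight length. A sketch of one common formulation is given below; the exact coefficients used in the study are not stated in the abstract, so the 29.8 * D * e_sat / (T + 273.2) form and the Tetens saturation vapor pressure equation are assumptions.

```python
import math

def hamon_pet(temp_c, daylight_hours):
    """Hamon potential evapotranspiration in mm/day, one common form:
        PET = 29.8 * D * e_sat / (T + 273.2)
    where D is daylight length in hours and e_sat is the saturation
    vapor pressure (kPa) from the Tetens equation."""
    e_sat = 0.611 * math.exp(17.27 * temp_c / (temp_c + 237.3))  # kPa
    return 29.8 * daylight_hours * e_sat / (temp_c + 273.2)

# A mild high-latitude summer day: roughly 2-3 mm/day
pet = hamon_pet(temp_c=15.0, daylight_hours=14.0)
```

    Penman–Monteith, by contrast, also requires vapor pressure, radiation, and wind inputs, which is why biases in the reanalysis vapor pressure data propagate into its ET estimates but leave Hamon-based estimates unaffected.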

    The tension between fire risk and carbon storage: evaluating U.S. carbon and fire management strategies through ecosystem models

    Fire risk and carbon storage are related environmental issues because fire reduction results in carbon storage through the buildup of woody vegetation, and stored carbon is a fuel for fires. The sustainability of the U.S. carbon sink and the extent of fire activity in the next 100 yr depend in part on the type and effectiveness of fire reduction employed. Previous studies have bracketed the range of dynamics from continued fire reduction to the complete failure of fire reduction activities. To improve these estimates, it is necessary to explicitly account for fire reduction in terrestrial models. A new fire reduction submodel that estimates the spatiotemporal pattern of reduction across the United States was developed using gridded data on biomass, climate, land-use, population, and economic factors. To the authors’ knowledge, it is the first large-scale, gridded fire model that explicitly accounts for fire reduction. The model was calibrated to 1° × 1° burned area statistics [Global Burnt Area 2000 Project (GBA-2000)] and compared favorably to three important diagnostics. The model was then implemented in a spatially explicit ecosystem model and used to analyze 1620 scenarios of future fire risk and fire reduction strategies. Under scenarios of climate change and urbanization, burned area and carbon emissions both increased in scenarios where fire reduction efforts were not adjusted to match new patterns of fire risk. Fuel-reducing management strategies reduced burned area and fire risk, but also limited carbon storage. These results suggest that to promote carbon storage and minimize fire risk in the future, fire reduction efforts will need to be increased and spatially adjusted and will need to employ a mixture of fuel-reducing and non-fuel-reducing strategies.

    An integration of spreadsheet and project management software for cost optimal time scheduling in construction

    Successful performance and completion of construction projects depend highly on adequate time scheduling of the project activities. In time scheduling, the execution modes of activities most often need to be set so as to achieve the minimum total project cost. This paper presents an approach to cost-optimal time scheduling that integrates a spreadsheet application with data transfer to project management software (PMS). The optimization problem of project time scheduling is modelled in Microsoft Excel and solved to optimality using Solver, while the organization of data is handled by macros. Microsoft Project is then used for further management and presentation of the optimized time-scheduling solution. In this way, the data flow between programs is automated and the possibility of errors occurring during the scheduling process is reduced to a minimum. Moreover, the integration of spreadsheet and PMS for cost-optimal time scheduling in construction is performed within a well-known program environment, which increases the possibilities of its wider use in practice. An application example is shown in this paper to demonstrate the advantages of the proposed approach.
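
    The underlying optimization can be illustrated with a deliberately small sketch: each activity offers execution modes (duration, direct cost), and the total cost adds an indirect cost per day of project duration. The brute-force enumeration below stands in for the Solver model; the serial-activity assumption, function name, and cost figures are all invented for illustration.

```python
from itertools import product

def cheapest_schedule(activities, indirect_per_day, deadline):
    """Pick one mode per activity (activities assumed serial) so that
    direct costs plus duration-dependent indirect costs are minimized
    while the total duration stays within the deadline."""
    best = None
    for modes in product(*activities):
        duration = sum(d for d, _ in modes)
        if duration > deadline:
            continue  # infeasible combination
        cost = sum(c for _, c in modes) + indirect_per_day * duration
        if best is None or cost < best[0]:
            best = (cost, modes)
    return best

activities = [
    [(3, 100), (2, 160)],  # activity A: normal vs. crashed mode (days, cost)
    [(4, 200), (3, 290)],  # activity B
]
best_cost, best_modes = cheapest_schedule(activities, indirect_per_day=40, deadline=7)
# best_cost -> 580, with both activities in their normal modes
```

    A real spreadsheet model would express the same trade-off with decision cells for the modes, a duration constraint, and a total-cost objective for Solver, with precedence relations rather than a simple serial chain.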

    CADISHI: Fast parallel calculation of particle-pair distance histograms on CPUs and GPUs

    We report on the design, implementation, optimization, and performance of the CADISHI software package, which calculates histograms of pair-distances of ensembles of particles on CPUs and GPUs. These histograms represent 2-point spatial correlation functions and are routinely calculated from simulations of soft and condensed matter, where they are referred to as radial distribution functions, and in the analysis of the spatial distributions of galaxies and galaxy clusters. Although conceptually simple, the calculation of radial distribution functions via distance binning requires the evaluation of O(N^2) particle-pair distances, where N is the number of particles under consideration. CADISHI provides fast parallel implementations of the distance histogram algorithm for the CPU and the GPU, written in templated C++ and CUDA. Orthorhombic and general triclinic periodic boxes are supported, in addition to the non-periodic case. The CPU kernels feature cache-blocking, vectorization, and thread-parallelization to obtain high performance. The GPU kernels are tuned to exploit the memory and processor features of current GPUs, demonstrating histogramming rates up to a factor of 40 higher than on a high-end multi-core CPU. To enable high-throughput analyses of molecular dynamics trajectories, the compute kernels are driven by the Python-based CADISHI engine. It implements a producer-consumer data processing pattern and thereby enables the complete utilization of all the CPU and GPU resources available on a specific computer, independent of special libraries such as MPI, covering commodity systems up to high-end HPC nodes. Data input and output are performed efficiently via HDF5. (...) The CADISHI software is freely available under the MIT license.
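
    The O(N^2) distance-binning computation that CADISHI accelerates can be written down in a few lines of plain Python for the non-periodic case. This naive reference version (names invented) is what the package's templated C++ and CUDA kernels replace with cache-blocked, vectorized, and GPU-parallel implementations.

```python
import math

def distance_histogram(points, n_bins, r_max):
    """Bin all particle-pair distances of `points` (list of (x, y, z)
    tuples) into n_bins equal-width bins over [0, r_max)."""
    hist = [0] * n_bins
    n = len(points)
    for i in range(n):
        xi, yi, zi = points[i]
        for j in range(i + 1, n):  # each unordered pair once: N(N-1)/2 distances
            xj, yj, zj = points[j]
            r = math.sqrt((xi - xj) ** 2 + (yi - yj) ** 2 + (zi - zj) ** 2)
            if r < r_max:
                hist[int(n_bins * r / r_max)] += 1
    return hist

pts = [(0, 0, 0), (1, 0, 0), (0, 2, 0)]
h = distance_histogram(pts, n_bins=4, r_max=4.0)
# pair distances 1.0, 2.0, sqrt(5) ~ 2.236  ->  h == [0, 1, 2, 0]
```

    Normalizing such a histogram by the ideal-gas pair count at each radius yields the radial distribution function g(r).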

    A Framework for Near Real-Time AFL Match Outcome Prediction

    Sports analysis has always been a real talking point amongst statisticians and sports personnel alike. However, the complexity of creating an efficient and accurate model, coupled with the difficulty of acquiring in-game statistics, has meant that most research has focused on ex-ante result prediction. This research presents a framework for the real-time prediction of match outcomes at various strategic points within an Australian Football League (AFL) match.

    Multi-Satellite Formation Trajectory Design with Topological Constraints over a Region of Interest using Differential Evolution

    Satellite formation missions allow for scientific measurement opportunities that are only otherwise possible with the use of unrealistically large satellites. This work applies the Evolutionary Algorithm (EA), Differential Evolution (DE), to a 4-satellite mission design that borrows heavily from the mission specifications for Phase 1 of NASA's Magnetospheric Multi-Scale Mission (MMS). This mission specifies goals for formation quality and size over the arc where scientific measurements are to be taken, known as the Region of Interest (ROI). To apply DE to this problem, a novel definition of fitness is developed and tailored to trajectory problems at the parameter scales of this mission. The method uses numerical integration of evolved initial conditions for trajectory determination. This approach allows for the inclusion of gravitational perturbations without altering the method. Here, the J2 oblateness correction is considered, but other effects such as solar radiation pressure and additional gravitational bodies can readily be included by amending the governing equations of integration, which are stored outside the method and called only during evaluation. A set of three launch conditions is evaluated using this method. Due to computational limitations, the design is restricted to single-impulse maneuvers at launch, and the ROI is initially restricted and then expanded through a process known here as staging. The ROIs of tests are expanded until they fail to meet the performance criteria; no result was able to stage to the full MMS-specified ±20° ROI, but this is a consequence of the single-impulse restriction. The number of orbits over which a launch condition can meet the performance criteria is also investigated. The revolutions considered, and the ROIs they contain, are staged to investigate whether the method can handle this additional problem space. Evidence of suitable formation trajectories found by this method is presented.
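
    The optimizer itself follows the standard DE/rand/1/bin scheme: mutate a random population vector with a scaled difference of two others, cross over with the current vector, and keep the trial only if its fitness improves. The sketch below applies it to a toy sphere function in place of the paper's trajectory fitness (which requires numerical integration of the orbital dynamics); the parameter values are generic defaults, not the paper's settings.

```python
import random

random.seed(0)  # for reproducibility of the sketch

def differential_evolution(fitness, bounds, pop_size=20, F=0.8, CR=0.9, generations=200):
    """DE/rand/1/bin: difference-vector mutation, binomial crossover,
    greedy one-to-one selection. Minimizes `fitness` within `bounds`."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = random.sample([p for k, p in enumerate(pop) if k != i], 3)
            j_rand = random.randrange(dim)  # guarantee at least one mutated component
            trial = [
                a[j] + F * (b[j] - c[j]) if (random.random() < CR or j == j_rand)
                else pop[i][j]
                for j in range(dim)
            ]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            if fitness(trial) <= fitness(pop[i]):
                pop[i] = trial
    return min(pop, key=fitness)

def sphere(x):  # toy objective standing in for the trajectory fitness
    return sum(v * v for v in x)

best = differential_evolution(sphere, bounds=[(-5.0, 5.0)] * 3)
# best should land near the origin
```

    In the mission-design setting, each candidate vector would encode the evolved initial conditions, and each fitness evaluation would integrate the trajectories (including the J2 term) and score formation quality and size over the ROI, which is why evaluations dominate the computational cost.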

    Synthetic industrial diamond : a technological outlook

    Summary in English. Bibliography: pages 102-118. Synthetic diamonds are successfully substituting for natural diamonds in the area of industrial application. Synthetic diamonds increased their market share from 10% in 1960 to 50% in 1968 and to 90% in 1994. The success of synthetic diamonds may be ascribed largely to technological advance in the area of diamond manufacture. Two technologies in particular contributed to this advance: (i) high-pressure, high-temperature (HPHT) processes for crystallising carbon material and (ii) chemical vapour deposition (CVD) of these materials. The substitution of synthetic for natural diamond occurred in a systematic and predictable manner. Further technological advance could threaten the concept of diamond as a unique and desirable substance in the minds of consumers and may require the repositioning of its image.